Former TikTok moderators sue over emotional toll of 'extremely disturbing' videos
Date: 2025-04-10
When Ashley Velez accepted a job last year reviewing videos for TikTok, "we were told we would be the front line of defense from protecting children from seeing violence," she said.
But the Las Vegas mother of two boys, ages 8 and 17, said she was stunned when she discovered what the position entailed.
"We would see death and graphic, graphic pornography. I would see nude underage children every day," Velez said in an interview. "I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight."
Velez worked for TikTok from May to November 2021, one of some 10,000 content moderators worldwide who police videos on the platform, making sure it remains an endless feed of lighthearted content rather than a cesspool of violent and disturbing videos.
Now, Velez and another former TikTok moderator, Reece Young, have filed a federal lawsuit seeking class action status against the video-sharing app and its parent company, ByteDance.
The Chinese-owned app is the envy of Silicon Valley social media giants and has more than 1 billion monthly active users. Its success depends in no small part on the work of moderators like Velez who toil behind the scenes to scrub TikTok of distressing content before the masses see it.
While the plight of moderators is often absent from fights over what content social media platforms allow and what they ban, there is a growing movement to hold tech giants accountable for the welfare of these workers on the front lines of that debate. On Thursday, Velez and Young sought to do just that.
"You see TikTok challenges, and things that seem fun and light, but most don't know about this other dark side of TikTok that these folks are helping the rest of us never see," said lawyer Steve Williams of the Joseph Saveri Law Firm, which filed the case.
Velez: "Somebody has to suffer and see this stuff"
Their lawsuit accuses TikTok of negligence and says it broke California labor laws by allegedly not protecting Velez and Young from the emotional trauma caused by reviewing hundreds of "highly toxic and extremely disturbing" videos every week, including videos of animal cruelty, torture and even the execution of children.
"Underage nude children was the plethora of what I saw," said Velez, who now works as an independent contractor for vacation-home rental site Boutiq. "People like us have to filter out the unsavory content. Somebody has to suffer and see this stuff so nobody else has to."
According to the suit, Young and Velez were exposed to an unsafe work environment because TikTok did not provide adequate mental health treatment to help deal with the anxiety, depression and post-traumatic stress associated with reviewing graphic videos.
Young and Velez were both contractors, not employees of TikTok. Young worked for the New York company Atrium; Velez was hired by Telus International, a publicly traded Canadian tech firm. The suit says TikTok and ByteDance controlled the day-to-day work of Young and Velez by directly tying their pay to how well they moderated content in TikTok's system and by pushing them to hit aggressive quota targets. Before they could start work, moderators had to sign non-disclosure agreements, the suit said, preventing them from discussing what they saw with even their families.
Moderators like Young and Velez are expected to review videos "for no longer than 25 seconds" and decide with more than 80% accuracy whether the content breaks one of TikTok's rules, according to the suit. To meet quotas, the suit alleges, moderators often watch multiple videos at once.
Young and Velez were allowed two 15-minute breaks and a lunch hour over a 12-hour workday. If they took any other breaks, they risked losing pay, the suit says.
That amounts to punishing the content moderators by "making them extremely ill-equipped to handle the mentally devastating imagery their work required them to view without any meaningful counseling or meaningful breaks during their work," wrote lawyer Joseph Saveri and other attorneys for the plaintiffs in the suit.
A TikTok spokeswoman declined to comment on the lawsuit but said the company "strives to promote a caring working environment for our employees and contractors."
TikTok moderators, according to the company, are offered "a range of wellness services so that moderators feel supported mentally and emotionally."
Telus International spokeswoman Jennifer Bach said in a statement that it "has a robust resiliency and mental health program in place to support all our team members, as well as a comprehensive benefits program for access to personal health and well-being services."
Velez said she did set up a meeting with a Telus counselor, who spoke with her for 30 minutes. "They saw so many people that it didn't seem like they had time to actually help you with what you were suffering with," she said. "It would have been nice if they would even acknowledge that the videos were causing a problem in the first place."
TikTok suit comes after $52 million Facebook settlement
Social media companies, including TikTok, use artificial intelligence to screen millions of videos for disturbing content, but the technology cannot catch everything, so human moderators remain critical to keeping the platforms safe.
"There's a hope that artificial intelligence can do all of this work, but that hope is not yet realized, so now humans mostly do this work," Williams said.
Yet people can act only so quickly. While moderators are pushed to work fast, they also are tasked with analyzing multiple aspects of each video, the suit states. There are now 100 "tags" — up from 20 — that moderators can apply when indicating a video violates a rule, such as a tag flagging a video as showing a minor's torso, and moderators are also expected to analyze what is happening in the background of each video.
It is unclear when, exactly, the new moderation standards went into effect. Since Russia invaded Ukraine, TikTok has been under new pressure as it attempts to stay ahead of a flood of misleading videos about the war.
In the suit, lawyers for Young and Velez argue that their clients' psychological trauma stems not just from videos that are violent but also from those that spread conspiracy theories, including recordings suggesting that the COVID-19 pandemic is a fraud or denying the Holocaust ever happened.
The National Center for Missing & Exploited Children has developed best practices for content moderators who are exposed to images and videos of exploited minors. Those measures include blurring parts of an image or superimposing a grid over the image to lessen its potential emotional impact on the people reviewing it.
The suit says that although TikTok is one of the center's corporate partners, along with Google and Facebook, it has not adopted the group's recommendations, instead focusing on hitting video-review quotas above all else. A spokeswoman for TikTok did not respond to a request for comment on the allegation.
The lawyers representing Velez and Young sued Facebook several years ago on behalf of thousands of moderators who said they experienced emotional distress on the job. In May 2020, Facebook agreed to settle the suit for $52 million. Individual moderators who were part of the class action were eligible for at least $1,000, depending on the severity of their emotional disorders related to their Facebook moderation work.
The same lawyers filed a similar suit against TikTok in December on behalf of moderator Candie Frazier. It was dropped last month, before a settlement could be reached, after, her lawyers say, she was fired.